hysop.operator.dummy module

Dummy operator. To be used for topology fixing.

class hysop.operator.dummy.Dummy(implementation=None, base_kwds=None, candidate_input_tensors=None, candidate_output_tensors=None, **impl_kwds)[source]

Bases: ComputationalGraphNodeFrontend

Initialize a ComputationalGraphNodeFrontend

Parameters:
  • implementation (Implementation, optional, defaults to None) – Target implementation; it should be contained in available_implementations(). If None, the implementation is set to default_implementation().

  • base_kwds (dict, optional, defaults to None) – Base class keyword arguments. If None, an empty dict will be passed.

  • impl_kwds – Keyword arguments that will be passed to the implementation's __init__.

implementation

the implementation key

Type:

Implementation

backend

the backend corresponding to implementation

Type:

Backend

impl

the implementation class

Type:

ComputationalGraphNodeGenerator or ComputationalGraphNode

impl_kwds

Keyword arguments that will be passed to impl.__init__ during a call to _generate.

classmethod default_implementation()[source]

Return the default Implementation; it should be contained in available_implementations().

classmethod implementations()[source]

Return all implementations as a dictionary. Keys are Implementation instances and values are either ComputationalGraphNode or ComputationalGraphNodeGenerator subclasses.
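
A minimal usage sketch (not part of the generated documentation above), assuming only that hysop is installed; the printed values depend on which backends are available:

    from hysop.operator.dummy import Dummy

    # Inspect the frontend's implementation registry: keys are Implementation
    # instances, values are the backing operator classes (PythonDummy and,
    # when OpenCL support is available, OpenClDummy).
    impls = Dummy.implementations()
    default = Dummy.default_implementation()

    print(default)         # the Implementation used when implementation=None
    print(impls[default])  # the operator class the frontend will instantiate

Any extra keyword arguments passed to Dummy(...) (impl_kwds) are forwarded unchanged to the __init__ of the selected implementation class.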

class hysop.operator.dummy.OpenClDummy(variables, **kwds)[source]

Bases: OpenClOperator

Create the common attributes of all OpenCL operators. See handle_method() and setup().

All input and output variable topologies should be of kind Backend.OPENCL and share the same OpenClEnvironment.

cl_env

OpenCL environment shared across all topologies.

Type:

OpenClEnvironment

Notes

About method keys:

OpenClKernelConfig: user build options, defines, precision and autotuner configuration.
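
A hypothetical sketch of passing an OpenClKernelConfig through the method dictionary; the import path hysop.methods, the method keyword and the field/topo objects are assumptions, not documented on this page:

    # Hypothetical sketch: tuning kernel build options through the 'method' dict.
    # Assumes OpenClKernelConfig can be imported from hysop.methods, and that
    # 'field' (a continuous field) and 'topo' (an OpenCL-backed topology) were
    # created elsewhere in the problem setup.
    from hysop.methods import OpenClKernelConfig
    from hysop.operator.dummy import OpenClDummy

    config = OpenClKernelConfig()  # default build options, defines, precision, autotuner settings
    op = OpenClDummy(variables={field: topo},
                     method={OpenClKernelConfig: config})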

apply(**kwds)

Abstract method that should be implemented. Applies this node (operator, computational graph operator…).

get_field_requirements()[source]

Called just after handle_method(), i.e. once self.method has been set. Topology requirements are:

  1. min and max ghosts for each input and output variable

  2. allowed splitting directions for cartesian topologies

  3. required local and global transposition states, if any (among other requirements)

These requirements are stored in self.input_field_requirements and self.output_field_requirements.

Keys are continuous fields and values are of type hysop.fields.field_requirement.discretefieldrequirements.

Default is Backend.OPENCL, no min or max ghosts, and no specific transposition state for any input or output variable.

classmethod supports_mpi()[source]

Return True if this operator was implemented to support multiple MPI processes.

class hysop.operator.dummy.PythonDummy(variables, **kwds)[source]

Bases: HostOperator

Create the common attributes of all host operators.

All input and output variable topologies should be of kind Backend.HOST and share the same HostEnvironment.

apply(**kwds)

Abstract method that should be implemented. Applies this node (operator, computational graph operator…).

get_field_requirements()[source]

Called just after handle_method(), i.e. once self.method has been set. Field requirements are:

  1. required local and global transposition state, if any.

  2. required memory ordering (either C or Fortran)

Default is Backend.HOST, no min or max ghosts, MemoryOrdering.ANY, and no specific default transposition state for any input or output variable.

classmethod supports_mpi()[source]

Return True if this operator was implemented to support multiple MPI processes.
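
A hypothetical sketch of instantiating the host implementation directly; 'field' and 'topo' stand for a continuous field and a Backend.HOST topology created elsewhere in the problem setup, and are assumptions, not part of this page:

    from hysop.operator.dummy import PythonDummy

    # The operator performs no computation; it only constrains the topologies
    # of its input/output variables (Backend.HOST, MemoryOrdering.ANY,
    # no ghost requirements), which makes it useful for topology fixing.
    op = PythonDummy(variables={field: topo})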